Web Survey Bibliography
Background: Health knowledge and literacy are among the main determinants of health, and their assessment via Web-based surveys is growing continuously. Research suggests that approximately one-fifth of respondents submit cribbed answers, or cheat, on factual knowledge items, which may introduce measurement error. However, little is known about methods of discouraging cheating in Web-based surveys on health knowledge.
Objective: This study explored the usefulness of imposing a survey time limit to prevent help-seeking and cheating.
Methods: On the basis of a sample size estimation, 94 undergraduate students were randomly assigned in a 1:1 ratio to complete a Web-based survey on nutrition knowledge either with or without a time limit of 15 minutes (30 seconds per item); nutrition was chosen as the topic because of its particular relevance to public health. The questionnaire consisted of two parts. The first was the validated consumer-oriented nutrition knowledge scale (CoNKS), consisting of 20 true/false items; the second was an ad hoc questionnaire (AHQ) containing 10 questions that would be very difficult for people without health care qualifications to answer correctly, and which therefore aimed at measuring cribbing rather than nutrition knowledge. AHQ items were somewhat encyclopedic and amenable to Web searching, whereas CoNKS items had more complex wording, so that simply copying and pasting a question into a search string would not immediately yield the correct answer.
Results: A total of 72 of the 94 subjects started the survey. Dropout rates were similar in the two groups (11%, 4/35 in the untimed group and 14%, 5/37 in the timed group). Most participants completed the survey on portable devices, such as mobile phones and tablets. Participants in the untimed group took a median 2.3 minutes longer to complete the survey than those in the timed group; the effect size was small (Cohen’s r=.29). Subjects in the untimed group scored significantly higher on the CoNKS (mean difference of 1.2 points, P=.008), with a medium effect size (Cohen’s d=0.67). By contrast, no significant between-group difference in AHQ scores was documented. Unexpectedly high AHQ scores were recorded in 23% (7/31) of untimed and 19% (6/32) of timed respondents, very probably owing to “e-cheating”.
Conclusions: Cribbing answers to health knowledge items under researcher-uncontrolled conditions is likely to lead to overestimation of people’s knowledge; this should be considered when designing and implementing Web-based surveys. Setting a time limit alone may not completely prevent cheating, as some cheaters may be very fast at Web searching. More complex and contextualized wording of items, and checking the “findability” of items before implementing a Web-based health knowledge survey, may discourage help-seeking and thus reduce measurement error. Studies with larger sample sizes and more diverse populations are needed to confirm these results.
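The medium effect size reported above (Cohen’s d=0.67) is the standardized mean difference between the two groups’ scores. A minimal sketch of how such a value is computed, using the pooled standard deviation and hypothetical score lists (the function name and data are illustrative, not from the study):

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d: difference in group means divided by the pooled SD."""
    na, nb = len(group_a), len(group_b)
    mean_a = sum(group_a) / na
    mean_b = sum(group_b) / nb
    # Sample variances (n - 1 in the denominator)
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (nb - 1)
    # Pooled standard deviation across both groups
    pooled_sd = math.sqrt(((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2))
    return (mean_a - mean_b) / pooled_sd

# Illustrative use with made-up knowledge scores
untimed_scores = [14, 15, 16, 17, 18]
timed_scores = [13, 14, 15, 16, 17]
print(round(cohens_d(untimed_scores, timed_scores), 3))
```

By convention, |d| around 0.2 is considered small, 0.5 medium, and 0.8 large, which is why the observed d=0.67 is described as a medium effect.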
Web survey bibliography - Marketing/business (336)
- Achieving Strong Privacy in Online Survey; 2017; Zhou, Yo.; Zhou, Yi.; Chen, S.; Wu, S. S.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Is There a Future for Surveys; 2017; Miller, P. V.
- Mobile Research im Kontext der digitalen Transformation; 2017; Friedrich-Freksa, M.
- Virtual reality meets sensory research; 2017; Depoortere, L.
- Online customer journey analysis: a data science toolbox; 2017; Bonnay, D.
- Comparing Twitter and Online Panels for Survey Recruitment of E-Cigarette Users and Smokers; 2016; Guillory, J.; Kim, A.; Murphy, J.; Bradfield, B.; Nonnemaker, J.; Hsieh, Y. P.
- Statistical Design for Online Experiments Across Desktops, Tablets, Smartphones (and Maybe Wearable...; 2016; Qian, P.; Sadeghi, S.; Arora, N. K.
- FocusVision 2015 Annual MR Technology Report; 2016; Macer, T.; Wilson, S.
- The Effects of a Delayed Incentive on Response Rates, Response Mode, Data Quality, and Sample Bias in...; 2016; McGonagle, K.; Freedman, V. A.
- A look at the unique data-gathering process behind the Harvard Impact Study; 2016; Vitale, J.
- Are sliders too slick for surveys?; 2016; Buskirk, T. D.
- Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk; 2016; Berinsky, A.; Huber, G. A.; Lenz, G. S.
- Web-based versus Paper-based Survey Data: An Estimation of Road Users’ Value of Travel Time Savings...; 2016; Kato, H.; Sakashita, A.; Tsuchiya, Tak.
- An Examination of Opposing Responses on Duplicated Multi-Mode Survey Responses; 2016; Djangali, A.
- Scientific Surveys Based on Incomplete Sampling Frames and High Rates of Nonresponse; 2016; Fahimi, M.; Barlas, F. M.; Thomas, R. K.; Buttermore, N. R.
- Adapting Labour Force Survey questions from interviewer-administered modes for web self-completion in...; 2015; Betts, P.; Cubbon, B.
- Internet Panels, Professional Respondents, and Data Quality; 2015; Matthijsse, S.; De Leeuw, E. D.; Hox, J.
- Are they willing to use the web? First results of a possible switch from PAPI to CAPI/CAWI in an establishment...; 2015; Ellguth, P.; Kohaut, S.
- GreenBook Research Industry Trends Report; 2015; Murphy, L. (Ed.)
- The role of gamification in better accessing reality and hence increasing data validity; 2015; Bailey, P.; Kernohan, H.; Pritchard, G.
- Rewarding the Truth; 2015; Puleston, J.
- Impact of raising awareness of respondents on the measurement quality in a web survey; 2015; Revilla, M.
- Email subject lines and response rates to invitations to participate in a web survey and a face-to-face...; 2015; Sappleton, N.; Lourenco, F.
- Can a non-probabilistic online panel achieve question quality similar to that of the European Social...; 2015; Revilla, M.; Saris, W. E.; Loewe, G.; Ochoa, C.
- Mode Effects in Mixed-Mode Economic Surveys: Insights from a Randomized Experiment; 2015; Hsu, J. W.; McFall, B. H.
- Web-based survey, calibration, and economic impact assessment of spending in nature based recreation; 2015; Paudel, K. P.; Devkota, N.; Gyawali, B.
- The Influence of Answer Box Format on Response Behavior on List-Style Open-Ended Questions; 2014; Keusch, F.
- Improving Survey Response Rates in Online Panels: Effects of Low-Cost Incentives and Cost-Free Text Appeal...; 2014; Pedersen, M. J.; Nielsen, C. V.
- Matrix versus paging designs in a brand attribution task; 2014; Conrad, F. G.; McCullough, W.; Nishimura, R.
- Internet-Based Surveys: Methodological Issues; 2014; Albaum, G.; Brockett, P.; Golden, L.; Han, V.; Roster, C. A.; Smith, S. M.; Wiley, J. B.
- Use of a Google Map Tool Embedded in an Internet Survey Instrument: Is it a Valid and Reliable Alternative...; 2014; Dasgupta, S.; Vaughan, A. S.; Kramer, M. R.; Sanchez, T. H.; Sullivan, P. S.
- Sequential or Simultaneous Multi-Mode? Results from Two Large Surveys of Electric Utility Consumers; 2014; Jackson, C.; Ledoux, C.
- Targeting the bias – the impact of mass media attention on sample composition and representativeness...; 2014; Steinmetz, S.; Oez, F.; Tijdens, K. G.
- Exploring selection biases for developing countries - is the web a promising tool for data collection...; 2014; Tijdens, K. G.; Steinmetz, S.
- Measuring the very long, fuzzy tail in the occupational distribution in web-surveys; 2014; Tijdens, K. G.
- Moving answers with the GyroScale: Using the mobile device’s gyroscope for market research purposes...; 2014; Luetters, H.; Kraus, M.; Westphal, D.
- Clicking vs. Dragging: Different Uses of the Mouse and Their Implications for Online Surveys; 2014; Sikkel, D.; Steenbergen, R.; Gras, S.
- Innovation for television research - online surveys via HbbTV. A new technology with fantastic opportunities...; 2014; Herche, J.; Adler, M.
- Online mobile surveys in Italy: coverage and other methodological challenges; 2014; Poggio, T.
- How Sliders Bias Survey Data; 2013; Sellers, R.
- Survey Research Response Rates: Internet Technology vs. Snail Mail; 2013; Lanier, P. A.; Tanner, J. R.; Totaro, M. W.; Gradnigo, G.
- The impact of New Zealand's 2008 prohibition of piperazine-based party pills on young people'...; 2013; Sheridan, J.; Dong, C. Y.; Butler, R.; Barnes, J.
- How well do volunteer web panel surveys measure sensitive behaviours in the general population, and...; 2013; Erens, B.; Burkill, S.; Copas, A.; Couper, M. P.; Conrad, F.
- Effects of Gamification on Participation and Data Quality in a Real-World Market Research Domain; 2013; Cechanowicz, J.; Gutwin, C.; Brownell, B.; Goodfellow, L.
- Ideal participants in online market research: Lessons from closed communities; 2013; Heinze, A.; Ferneley, E.; Child, P.
- Online, face-to-face and telephone surveys—Comparing different sampling methods in wine consumer...; 2013; Szolnoki, G.; Hoffmann, D.
- Where does the Fair Trade price premium go? Confronting consumers' request with reality; 2013; Langen, N.; Adenaeuer, L.
- Customer satisfaction in Web 2.0 and information technology development; 2013; Sharma, G.; Baoku, L.
- Research staff and public engagement: a UK study; 2013; Davies, S.